Imitation of Human Motion by Low Degree-of-Freedom Simulated Robots and Human Preference for Mappings Driven by Spinal, Arm, and Leg Activity

International Journal of Social Robotics

Abstract

Robots cannot exactly replicate human motion, especially low degree-of-freedom (DOF) robots, but perceptual imitation has been accomplished. Nevertheless, the many possible mappings between human and robot bodies raise questions about which aspects of human motion robots should preserve. In this vein, this paper presents a methodology for mapping human motion capture data to the motion of a low-DOF simulated robot; further, empirical experiments conducted on Amazon Mechanical Turk illuminate human preference across several such mappings. Users preferred motion-capture-driven robot motion over artificially generated robot motion, suggesting that the proposed mappings successfully accomplished imitation. Moreover, one mapping, relating to the leaning of the spine, was preferred over arm- and leg-based mappings by a significant subgroup of respondents who were loyal to that mapping across multiple stimuli and were more engaged in the survey than other respondents. These results reconfirm the ability of simple robots to imitate human behavior and indicate that monitoring human spinal activity may be especially useful in this pursuit. Parallel work in psychology and human behavior analysis suggests that successful imitation of the motion of human counterparts is necessary for robots to integrate into human-facing environments.
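To give a concrete sense of what a spine-driven mapping of this kind might look like, the sketch below estimates a lean angle from two motion-capture markers and linearly rescales it onto a single robot joint. This is a minimal illustration only: the marker names, the z-up axis convention, and the linear rescaling are assumptions, not the paper's exact method.

```python
import numpy as np

def spine_lean_angle(pelvis, neck):
    """Angle (radians) between the pelvis->neck segment and world vertical.

    `pelvis` and `neck` are hypothetical 3D marker positions; real mocap
    skeletons name and place markers differently.
    """
    seg = np.asarray(neck, dtype=float) - np.asarray(pelvis, dtype=float)
    vertical = np.array([0.0, 0.0, 1.0])  # assumes a z-up capture volume
    cos_theta = seg @ vertical / np.linalg.norm(seg)
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

def to_robot_joint(theta, theta_max=np.pi / 4, joint_limit=0.8):
    """Rescale the human lean angle onto one joint of a low-DOF robot.

    Leans beyond `theta_max` saturate at the (illustrative) joint limit.
    """
    return min(theta / theta_max, 1.0) * joint_limit

# One mocap frame: pelvis and neck marker positions in meters.
frame_pelvis = [0.00, 0.00, 1.00]
frame_neck = [0.12, 0.00, 1.48]

theta = spine_lean_angle(frame_pelvis, frame_neck)
print(f"lean: {theta:.3f} rad -> joint command: {to_robot_joint(theta):.3f} rad")
```

Applied frame by frame to a capture sequence, a mapping of this shape drives one robot DOF from spinal lean; analogous functions of arm or leg markers would yield the competing mappings compared in the study.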



Acknowledgements

We would like to thank Erin Berl for providing the human movement used to generate the mappings for this study.

Funding

This work was supported by DARPA Grant #D16AP00001.

Author information

Corresponding author

Correspondence to Roshni Kaushik.

Ethics declarations

Conflict of interest

Amy LaViers owns stock in AE Machines, Inc. and caali, LLC.

Research Involving Human Participants and/or Animals

Studies with human subjects were governed by IRB #16225.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix

See Tables 5, 6, 7, 8, 9 and 10.

Table 5 Age of participants
Table 6 Gender of participants
Table 7 Native language of participants
Table 8 Years speaking English of participants
Table 9 Highest level of education completed by participants
Table 10 Experience level of participants in various movement activities


About this article

Cite this article

Kaushik, R., LaViers, A. Imitation of Human Motion by Low Degree-of-Freedom Simulated Robots and Human Preference for Mappings Driven by Spinal, Arm, and Leg Activity. Int J of Soc Robotics 11, 765–782 (2019). https://doi.org/10.1007/s12369-019-00595-y
